# Weight-Sharing Architecture
## Albert Base V1

- License: Apache-2.0
- Tags: Large Language Model · Transformers · English
- Owner: albert
- Downloads: 18.34k · Likes: 11

ALBERT is a lightweight pre-trained language model based on the Transformer architecture. It is trained on English text through self-supervised learning and uses cross-layer parameter sharing to reduce memory usage.
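As a minimal sketch of using this checkpoint (assuming the Hugging Face `transformers` and `torch` packages and the public `albert-base-v1` model identifier), the model can be loaded for masked language modeling; because the Transformer layers share one set of weights, the loaded model stays small relative to a comparable BERT:

```python
# Minimal sketch: load albert-base-v1 for masked language modeling.
# Assumes the `transformers` and `torch` packages are installed and the
# "albert-base-v1" checkpoint is reachable from the model hub.
import torch
from transformers import AlbertTokenizer, AlbertForMaskedLM

tokenizer = AlbertTokenizer.from_pretrained("albert-base-v1")
model = AlbertForMaskedLM.from_pretrained("albert-base-v1")

# ALBERT's masked-LM objective uses the tokenizer's mask token.
text = f"The capital of France is {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Take the highest-scoring token at the masked position.
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
print(tokenizer.decode(logits[0, mask_pos].argmax(dim=-1)))
```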
## Albert Large V1

- License: Apache-2.0
- Tags: Large Language Model · Transformers · English
- Owner: albert
- Downloads: 979 · Likes: 3

ALBERT is a lightweight BERT variant pre-trained on English corpora. It reduces memory usage through cross-layer parameter sharing and supports masked language modeling and sentence-order prediction tasks.
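The sentence-order prediction head mentioned above sits alongside the masked-LM head in the pretraining model class of `transformers`. A hedged sketch, assuming the `albert-large-v1` identifier and that the checkpoint carries both pretraining heads:

```python
# Minimal sketch: inspect ALBERT's two pretraining heads (masked LM and
# sentence-order prediction). Assumes `transformers`, `torch`, and the
# "albert-large-v1" checkpoint; any head weights missing from the checkpoint
# would be freshly initialized by from_pretrained.
import torch
from transformers import AlbertTokenizer, AlbertForPreTraining

tokenizer = AlbertTokenizer.from_pretrained("albert-large-v1")
model = AlbertForPreTraining.from_pretrained("albert-large-v1")

# A sentence pair; SOP is trained to detect whether the two segments are swapped.
inputs = tokenizer(
    "She opened the door.",
    "Then she walked inside.",
    return_tensors="pt",
)

with torch.no_grad():
    outputs = model(**inputs)

print(outputs.prediction_logits.shape)  # (batch, seq_len, vocab) -> masked-LM head
print(outputs.sop_logits.shape)         # (batch, 2) -> sentence-order prediction head
```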